Rényi entropy
In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy, and the min-entropy. Entropies quantify the diversity, uncertainty, or randomness of a system. The Rényi entropy is named after Alfréd Rényi.
The Rényi entropy is important in ecology and statistics as an index of diversity. It is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of ''α'' can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors.
== Definition ==
The Rényi entropy of order \alpha, where \alpha \geq 0 and \alpha \neq 1, is defined as
:H_\alpha(X) = \frac{1}{1-\alpha}\log\Bigg(\sum_{i=1}^n p_i^\alpha\Bigg) .
Here, X is a discrete random variable with possible outcomes 1,2,\dots,n and corresponding probabilities p_i \doteq \Pr(X=i) for i=1,\dots,n, and the logarithm is base 2.
If the probabilities are p_i=1/n for all i=1,\dots,n, then all the Rényi entropies of the distribution are equal: H_\alpha(X)=\log n.
In general, for all discrete random variables X, H_\alpha(X) is a non-increasing function in \alpha.
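The definition and the two facts above are easy to check numerically. The following sketch (the function name is chosen here, not part of the article) computes the Rényi entropy in bits, assuming strictly positive probabilities that sum to 1:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha >= 0, alpha != 1), in bits.

    Assumes every p_i > 0 and sum(p) == 1.
    """
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

# Uniform distribution: H_alpha = log2(n) for every order alpha.
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0.5), renyi_entropy(uniform, 2))  # 2.0 2.0

# Non-uniform distribution: H_alpha is non-increasing in alpha.
p = [0.5, 0.3, 0.2]
values = [renyi_entropy(p, a) for a in (0, 0.5, 2, 5, 20)]
assert all(a >= b for a, b in zip(values, values[1:]))
```

Note that \alpha = 0 recovers the Hartley entropy \log n and \alpha = 2 the collision entropy, as in the generalization described above.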
Applications often exploit the following relation between the Rényi entropy and the ''p''-norm of the vector of probabilities:
:H_\alpha(X)=\frac{\alpha}{1-\alpha} \log \left(\|P\|_\alpha\right) .
Here, the discrete probability distribution P=(p_1,\dots,p_n) is interpreted as a vector in \R^n with p_i\geq 0 and \sum_{i=1}^{n} p_i = 1, and \|P\|_\alpha = \left(\sum_{i=1}^n p_i^\alpha\right)^{1/\alpha}.
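As a quick sanity check (a sketch; the helper name is chosen here, not from the article), computing the entropy through the \alpha-norm agrees with the direct definition:

```python
import math

def renyi_via_norm(p, alpha):
    # ||P||_alpha = (sum_i p_i^alpha)^(1/alpha)
    norm = sum(pi ** alpha for pi in p) ** (1.0 / alpha)
    # H_alpha = alpha/(1 - alpha) * log2(||P||_alpha)
    return (alpha / (1.0 - alpha)) * math.log2(norm)

p = [0.5, 0.3, 0.2]
# Direct definition for order alpha = 3: log2(sum p_i^3) / (1 - 3)
direct = math.log2(sum(pi ** 3 for pi in p)) / (1 - 3)
print(abs(renyi_via_norm(p, 3) - direct) < 1e-9)  # True
```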
The Rényi entropy for any \alpha \geq 0 is Schur concave.

Source: the free encyclopedia Wikipedia, "Rényi entropy".